Minimum Hellinger Distance Estimation with Inlier Modification
Authors
Abstract
Inference procedures based on the Hellinger distance provide attractive alternatives to likelihood-based methods for the statistician. The minimum Hellinger distance estimator has full asymptotic efficiency under the model together with strong robustness properties under model misspecification. However, the Hellinger distance puts too large a weight on the inliers, which appears to be the main reason for the poor efficiency of the method in small samples. Here some modifications to the inlier part of the Hellinger distance are provided which lead to substantial improvements in the small sample properties of the estimators. The modified divergences are members of the general class of disparities and satisfy the necessary regularity conditions, so that the asymptotic properties of the resulting estimators follow from standard theory. In limited simulations the proposed estimators exhibit better small sample performance at the model and competitive robustness properties in relation to the ordinary minimum Hellinger distance estimator. As the asymptotic efficiencies of the modified estimators are the same as that of the ordinary estimator, the new procedures are expected to be useful tools for applied statisticians and data analysts. AMS (2000) subject classification. Primary To be filled.
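As a point of reference, the sketch below illustrates the ordinary minimum Hellinger distance estimator for a Poisson model, minimizing the disparity form 2 Σ_x (√d(x) − √f_θ(x))² between the empirical relative frequencies d and the model probabilities f_θ. The Poisson choice, the contaminated sample, and the support truncation are assumptions of this illustration; the paper's specific inlier modification is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

def hellinger_disparity(theta, values, counts):
    """Hellinger disparity 2 * sum_x (sqrt(d(x)) - sqrt(f_theta(x)))^2 between the
    empirical relative frequencies d and the Poisson(theta) probabilities f_theta."""
    support = np.arange(values.max() + 30)      # truncate the infinite Poisson support
    d = np.zeros(support.size)
    d[values] = counts / counts.sum()           # observed relative frequencies
    f = poisson.pmf(support, mu=theta)
    return 2.0 * np.sum((np.sqrt(d) - np.sqrt(f)) ** 2)

# hypothetical sample: Poisson(3) data with a single gross outlier appended
rng = np.random.default_rng(0)
sample = np.concatenate([rng.poisson(3, size=50), [20]])
values, counts = np.unique(sample, return_counts=True)

res = minimize_scalar(hellinger_disparity, bounds=(0.1, 30.0), method="bounded",
                      args=(values, counts))
print("minimum Hellinger distance estimate:", round(res.x, 3))
print("maximum likelihood estimate (sample mean):", round(sample.mean(), 3))
```

In this disparity, cells with fewer observations than the model predicts (d(x) < f_θ(x), the inliers) still contribute with relatively large weight; those are the terms the modifications described in the abstract are aimed at.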
Similar Articles
Efficient Hellinger distance estimates for semiparametric models
Minimum distance techniques have become increasingly important tools for solving statistical estimation and inference problems. In particular, the successful application of the Hellinger distance approach to fully parametric models is well known. The corresponding optimal estimators, known as minimum Hellinger distance estimators, achieve efficiency at the model density and simultaneously possess...
A minimum Hellinger distance estimator for stochastic differential equations: An application to statistical inference for continuous time interest rate models
A Minimum Disparity Distance Estimator minimizes a φ-divergence between the marginal density of a parametric model and its non-parametric estimate. This principle is applied to the estimation of stochastic differential equation models, choosing the Hellinger distance as the particular φ-divergence. Under a hypothesis of stationarity, the parametric marginal density is provided by solving the Kolmogorov...
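A rough sketch of that idea under a simple assumption: take the Ornstein-Uhlenbeck (Vasicek) short-rate model, whose stationary marginal is Gaussian, estimate the marginal non-parametrically with a kernel density estimate, and minimize the Hellinger distance between the two over a grid. The simulated path, the OU choice, and the reparametrization in terms of the stationary mean and standard deviation (the only quantities identified by the marginal alone) are assumptions of this illustration; the paper's procedure, which obtains the parametric marginal from the Kolmogorov equation, is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gaussian_kde, norm

# hypothetical data: Euler scheme for the Ornstein-Uhlenbeck (Vasicek) model
#   dX_t = kappa * (mu - X_t) dt + sigma dW_t
rng = np.random.default_rng(1)
kappa, mu, sigma, dt, n = 2.0, 0.05, 0.10, 1.0 / 252.0, 5000
x = np.empty(n)
x[0] = mu
for t in range(1, n):
    x[t] = x[t - 1] + kappa * (mu - x[t - 1]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# non-parametric estimate of the stationary (marginal) density on a grid
grid = np.linspace(x.min() - 0.05, x.max() + 0.05, 400)
step = grid[1] - grid[0]
g_hat = gaussian_kde(x)(grid)

def hellinger(params):
    """Squared Hellinger distance between the kernel density estimate and the
    Gaussian stationary density implied by the OU model, approximated on the grid."""
    m, s = params                      # stationary mean and standard deviation
    if s <= 0:
        return np.inf
    f = norm.pdf(grid, loc=m, scale=s)
    return np.sum((np.sqrt(g_hat) - np.sqrt(f)) ** 2) * step

res = minimize(hellinger, x0=[x.mean(), x.std()], method="Nelder-Mead")
m_hat, s_hat = res.x
print("estimated stationary mean and std dev:", m_hat, s_hat)
print("true stationary std dev sigma/sqrt(2*kappa):", sigma / np.sqrt(2 * kappa))
```

Because the marginal density of the OU process depends on kappa and sigma only through sigma²/(2·kappa), the sketch estimates the stationary standard deviation rather than the two parameters separately.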
Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil
In this paper we examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in a Dynamic Linear Model, using the real price of oil for 106 years of data from 1913 to 2018, concerning the asymmetric problem in filtering and forecasting. We use the DLM form of the basic Hotelling Model under the Quadratic loss function, Kullback-Leibler, Hellinger and LINEX, trying to address the ...
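To illustrate why the choice of loss function matters for the point forecast itself, the snippet below compares the optimal forecast under quadratic loss (the predictive mean) with the optimal forecast under the asymmetric LINEX loss, a* = -(1/c) log E[exp(-c·y)]. The Gaussian predictive draws and the values of c are hypothetical; the DLM filtering, the Hotelling model, and the oil-price data from the paper are not reproduced.

```python
import numpy as np

# hypothetical draws from a one-step-ahead predictive distribution for the price
rng = np.random.default_rng(2)
pred = rng.normal(loc=60.0, scale=8.0, size=100_000)

# quadratic loss: the optimal point forecast is the predictive mean
quad_forecast = pred.mean()

# LINEX loss L(a, y) = exp(c * (a - y)) - c * (a - y) - 1:
# the optimal point forecast is a* = -(1/c) * log E[exp(-c * y)]
def linex_forecast(draws, c):
    return -np.log(np.mean(np.exp(-c * draws))) / c

print("quadratic-loss forecast:", round(quad_forecast, 2))
print("LINEX forecast, c =  0.05 (over-prediction costly): ", round(linex_forecast(pred, 0.05), 2))
print("LINEX forecast, c = -0.05 (under-prediction costly):", round(linex_forecast(pred, -0.05), 2))
```

With c > 0 the LINEX forecast is shaded below the predictive mean to avoid costly over-prediction, and above it when c < 0; quadratic loss treats both directions symmetrically.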
Minimum Profile Hellinger Distance Estimation For A Semiparametric Mixture Model
In this paper, we propose a new effective estimator for a class of semiparametric mixture models where one component has a known distribution with possibly unknown parameters while the other component density and the mixing proportion are unknown. Such semiparametric mixture models have often been used in multiple hypothesis testing and the sequential clustering algorithm. The proposed estimator ...
Parameter Estimation of Some Archimedean Copulas Based on Minimum Cramér-von-Mises Distance
The purpose of this paper is to introduce a new method for estimating the Archimedean copula dependence parameter in the non-parametric setting. The dependence parameter is estimated as the value that minimizes the Cramér-von-Mises distance between the empirical Bernstein Kendall distribution function and the true Kendall distribution function...
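A minimal sketch of the minimum Cramér-von-Mises idea for a single Archimedean family: simulate from a Clayton copula, form the empirical Kendall distribution function from the pseudo-observations, and choose θ to minimize a discretized Cramér-von-Mises distance to the Clayton Kendall function K(t) = t + (t − t^(θ+1))/θ. The Clayton family, the sample size, the grid, and the use of the raw (rather than Bernstein-smoothed) empirical Kendall function are assumptions of this illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# hypothetical data: n pairs from a Clayton copula, via conditional sampling
rng = np.random.default_rng(3)
n, theta_true = 400, 2.0
u = rng.uniform(size=n)
w = rng.uniform(size=n)
v = (u ** (-theta_true) * (w ** (-theta_true / (1 + theta_true)) - 1) + 1) ** (-1 / theta_true)

# pseudo-observations W_i: proportion of sample points strictly below (u_i, v_i)
W = np.array([np.mean((u < u[i]) & (v < v[i])) for i in range(n)])

def kendall_clayton(t, theta):
    """Kendall distribution K(t) = t - phi(t)/phi'(t) for the Clayton generator
    phi(t) = (t^(-theta) - 1)/theta, which simplifies to t + (t - t^(theta+1))/theta."""
    return t + (t - t ** (theta + 1)) / theta

def cvm_distance(theta, t_grid, K_emp):
    """Discretized Cramer-von-Mises distance between empirical and model Kendall functions."""
    return np.mean((K_emp - kendall_clayton(t_grid, theta)) ** 2)

t_grid = np.linspace(0.01, 0.99, 200)
K_emp = np.array([np.mean(W <= t) for t in t_grid])

res = minimize_scalar(cvm_distance, bounds=(0.05, 20.0), method="bounded",
                      args=(t_grid, K_emp))
print("estimated Clayton theta:", round(res.x, 3), " (true value:", theta_true, ")")
```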